Fix for qwen model prompt (#514) #29

Merged
kovtcharov merged 3 commits into main from kalin/qwen on Mar 26, 2025

Conversation

@kovtcharov
Collaborator

Fix Qwen model prompt formatting:

  • Add Qwen system message to system_messages dictionary
  • Refactor get_system_prompt to use format_chat_history for consistent formatting
  • Fix empty assistant responses and extra newlines in Qwen chat format
  • Fix the following error message:
```
[2025-03-24 14:30:04] | ERROR | aiohttp.server.log_exception | web_protocol.py:451 | Error handling request from 127.0.0.1
Traceback (most recent call last):
  File "C:\Users\kalin\miniconda3\envs\gaiaenv3\lib\site-packages\aiohttp\web_protocol.py", line 480, in _handle_request
    resp = await request_handler(request)
  File "C:\Users\kalin\miniconda3\envs\gaiaenv3\lib\site-packages\aiohttp\web_app.py", line 569, in _handle
    return await handler(request)
  File "C:\Users\kalin\Work\gaia3\src\gaia\agents\agent.py", line 204, in _on_prompt_received
    response = self.prompt_received(data["prompt"])
  File "C:\Users\kalin\Work\gaia3\src\gaia\agents\Chaty\app.py", line 41, in prompt_received
    response = self.prompt_llm(prompt)
  File "C:\Users\kalin\Work\gaia3\src\gaia\agents\Chaty\app.py", line 31, in prompt_llm
    prompt = Prompts.get_system_prompt(
  File "C:\Users\kalin\Work\gaia3\src\gaia\agents\Chaty\prompts.py", line 264, in get_system_prompt
    system_message = cls.system_messages[model_type]
KeyError: 'qwen'
```

closes #24
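The `KeyError: 'qwen'` occurs because `cls.system_messages` has no entry for the `qwen` model type. A minimal sketch of the kind of fix the bullets describe; the dictionary contents, the `format_chat_history` signature, and the fallback model type are illustrative assumptions, not the repository's actual code:

```python
class Prompts:
    # Map model type to its system message. The missing 'qwen' entry is
    # what made cls.system_messages[model_type] raise KeyError: 'qwen'.
    system_messages = {
        "llama": "You are a helpful assistant.",
        "qwen": "You are a helpful assistant.",  # newly added entry
    }

    @classmethod
    def format_chat_history(cls, system_message, history):
        # Qwen uses the ChatML-style <|im_start|>/<|im_end|> template.
        # Skipping empty turns avoids blank assistant responses and the
        # extra newlines mentioned in the PR description.
        parts = [f"<|im_start|>system\n{system_message}<|im_end|>"]
        for role, content in history:
            if not content.strip():
                continue
            parts.append(f"<|im_start|>{role}\n{content.strip()}<|im_end|>")
        parts.append("<|im_start|>assistant\n")
        return "\n".join(parts)

    @classmethod
    def get_system_prompt(cls, model_type, history):
        # .get() with a fallback avoids the KeyError for unknown model types.
        system_message = cls.system_messages.get(
            model_type, cls.system_messages["llama"]
        )
        return cls.format_chat_history(system_message, history)
```

With this shape, `Prompts.get_system_prompt("qwen", history)` resolves a system message instead of raising, and the prompt ends cleanly at the assistant turn.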

@kovtcharov kovtcharov added the bug Something isn't working label Mar 25, 2025
@kovtcharov kovtcharov self-assigned this Mar 25, 2025
@kovtcharov kovtcharov mentioned this pull request Mar 25, 2025
@kovtcharov kovtcharov enabled auto-merge (squash) March 26, 2025 16:23
@kovtcharov kovtcharov merged commit 7d3e77e into main Mar 26, 2025
4 checks passed
@kovtcharov kovtcharov deleted the kalin/qwen branch March 26, 2025 22:59
itomek pushed a commit that referenced this pull request Mar 12, 2026

Labels

bug Something isn't working

Projects

None yet

Development

Successfully merging this pull request may close these issues.

Cant start Qwen-1.5

3 participants